
    A TSSA algorithm based approach to enhance the performance of warehouse system

    In an era of increased competitiveness and globalization, effective management of the warehouse system is a challenging task. Recognizing that proper scheduling of warehouses is necessary to outperform competitors on cost, lead time, and customer service (Koster, 1998), the proposed research focuses on the optimization of warehouse scheduling problems. This research aims to minimize total tardiness so that the overall time involved in managing inventory inside the warehouse can be effectively reduced. It also addresses vehicle routing in the warehousing scenario and considers the various constraints and decision variables that directly influence the objective, making the model more realistic. The authors propose a hybrid tabu sample-sort simulated annealing (TSSA) algorithm to reduce tardiness and enhance the performance of the warehousing system. The proposed TSSA algorithm inherits the merits of the tabu search and sample-sort annealing algorithms. A comparative analysis of the TSSA algorithm against simulated annealing (SA), tabu search (TS), and hybrid tabu search algorithms indicates its superiority over the others, both in computational time and in total tardiness reduction.
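The abstract does not spell out the TSSA procedure, but the kind of hybrid it describes — a simulated annealing acceptance rule guarded by a tabu list — can be sketched generically. The following is an illustration only; the swap neighbourhood, job data, and all parameters are hypothetical, not taken from the paper:

```python
import math
import random

def total_tardiness(seq, proc, due):
    """Total tardiness of a job sequence: sum of max(0, completion - due)."""
    t = tard = 0
    for j in seq:
        t += proc[j]
        tard += max(0, t - due[j])
    return tard

def tabu_annealing(proc, due, iters=2000, temp=50.0, cool=0.995, tabu_len=15, seed=0):
    """Generic tabu-guarded simulated annealing for single-machine total
    tardiness (a TSSA-style hybrid sketch, not the paper's algorithm).
    Neighbourhood: swap two jobs; the tabu list stores recently swapped pairs."""
    rng = random.Random(seed)
    n = len(proc)
    cur = list(range(n))
    cur_cost = total_tardiness(cur, proc, due)
    best, best_cost = cur[:], cur_cost
    tabu = []
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        move = (min(cur[i], cur[j]), max(cur[i], cur[j]))
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]
        cost = total_tardiness(cand, proc, due)
        # Tabu check with aspiration (always allow a new global best),
        # then the usual Metropolis acceptance test.
        if (move not in tabu or cost < best_cost) and \
           (cost < cur_cost or rng.random() < math.exp((cur_cost - cost) / max(temp, 1e-9))):
            cur, cur_cost = cand, cost
            tabu.append(move)
            if len(tabu) > tabu_len:
                tabu.pop(0)
            if cur_cost < best_cost:
                best, best_cost = cur[:], cur_cost
        temp *= cool
    return best, best_cost
```

On a toy instance with processing times [2, 1] and due dates [1, 3], scheduling job 0 first attains the minimum total tardiness of 1.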

    Estimating Smooth Transition Autoregressive Models with GARCH Errors in the Presence of Extreme Observations and Outliers

    This paper investigates several empirical issues regarding quasi-maximum likelihood estimation of Smooth Transition Autoregressive (STAR) models with GARCH errors, specifically STAR-GARCH and STAR-STGARCH. Convergence, the choice of different algorithms for maximising the likelihood function, and the sensitivity of the estimates to outliers and extreme observations are examined using daily data for the S&P 500, Hang Seng and Nikkei 225 for the period January 1986 to April 2000.

    "On the Structure, Asymptotic Theory and Applications of STAR-GARCH Models"

    Non-linear time series models, especially regime-switching models, have become increasingly popular in the economics, finance and financial econometrics literature. However, much of the research has concentrated on the empirical applications of various models, with little theoretical or statistical analysis associated with the structure of the models or asymptotic theory. Some structural and statistical properties have recently been established for the Smooth Transition Autoregressive (STAR) - Generalised Autoregressive Conditional Heteroscedasticity (GARCH), or STAR-GARCH, model, including the necessary and sufficient conditions for the existence of moments, and the sufficient condition for consistency and asymptotic normality of the (Quasi)-Maximum Likelihood Estimator ((Q)MLE). While these moment conditions are straightforward to verify in practice, they may not be satisfied for the GARCH model if the underlying long run persistence is close to unity. A less restrictive condition for consistency and asymptotic normality may alleviate this problem. The paper establishes a weak sufficient, or log-moment, condition for consistency and asymptotic normality of (Q)MLE for STAR-GARCH. This condition can easily be extended to any non-linear conditional mean model with GARCH errors, subject to reasonable regularity conditions. Although the log-moment condition cannot be verified as easily as the second and fourth moment conditions, it allows the long run persistence of the GARCH process to exceed one. Monte Carlo experiments show that the log-moment condition is more reliable in practice than the second and fourth moment conditions when the underlying long run persistence is close to unity. These experiments also show that the correct specification of the conditional mean is crucial in obtaining unbiased estimates for the GARCH component.
The sufficient conditions for consistency and asymptotic normality are verified empirically using S&P 500 returns, 3-month US Treasury Bill returns, and exchange rates between Australia and the USA. The effects of outliers and extreme observations on the empirical moment conditions are also analysed in detail.
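For a GARCH(1,1) process, h_t = ω + α·ε²_{t-1} + β·h_{t-1} with ε_t = η_t·√h_t, the second-moment condition requires the long-run persistence α + β < 1, whereas the weaker log-moment condition requires E[log(α·η² + β)] < 0. The latter is awkward to verify analytically but straightforward to estimate by simulation. The sketch below assumes Gaussian η; the parameter values are illustrative, not estimates from the paper:

```python
import math
import random

def log_moment(alpha, beta, n=200_000, seed=42):
    """Monte Carlo estimate of E[log(alpha * eta^2 + beta)] for eta ~ N(0, 1).
    A negative value satisfies the log-moment sufficient condition for
    consistency and asymptotic normality of the (Q)MLE discussed in the text."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        eta = rng.gauss(0.0, 1.0)
        total += math.log(alpha * eta * eta + beta)
    return total / n
```

For example, alpha = 0.3, beta = 0.72 gives persistence 1.02, so the second-moment condition fails, yet the estimated log-moment is clearly negative, so the weaker condition still holds; at alpha = 0.2, beta = 0.9 (persistence 1.1) both conditions fail.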

    A hybrid CFGTSA based approach for scheduling problem: a case study of an automobile industry

    In a globally competitive world, swift, reliable and cost-effective production under uncertainty, through appropriate management of the available resources, has become a necessity for survival in the market. This has inspired the development of more efficient and robust methods to counteract the complexities prevailing in the market. The present paper proposes a hybrid CFGTSA algorithm inheriting the salient features of GA, TS, SA, and chaos theory to solve the complex scheduling problems commonly faced by manufacturing industries. The proposed CFGTSA algorithm has been tested on a scheduling problem from an automobile industry, and its efficacy is shown by comparing the results with GA, SA, TS, GTS, and hybrid TSA algorithms.

    An ESPC algorithm based approach to solve inventory deployment problem

    Global competitiveness has pushed heavy industries towards greater customization. To compete in the market, they target customers who want exotic products and faster, more reliable deliveries. Industries are exploring the option of satisfying a portion of their demand by converting strategically placed semi-finished products, which helps increase the variety of products they can offer within a short lead time. In this paper, the authors propose a new hybrid evolutionary algorithm, the Endosymbiotic-Psychoclonal (ESPC) algorithm, to determine the amount and type of product to stock as semi-finished inventory. In the proposed work, the ability of the previously proposed Psychoclonal algorithm to exploit the search space is increased by making antibodies and antigens more cooperative interacting species. The efficacy of the proposed algorithm is tested on randomly generated datasets, and the results are compared with other evolutionary algorithms such as the Genetic Algorithm (GA) and Simulated Annealing (SA). The comparison of ESPC with GA and SA shows the superiority of the proposed algorithm both in the quality of the solution obtained and in the convergence time required to reach the optimal or near-optimal solution.

    "Structure and Asymptotic Theory for Multivariate Asymmetric Volatility: Empirical Evidence for Country Risk Ratings"

    Following the rapid growth in the international debt of less developed countries in the 1970s and the increasing incidence of debt rescheduling in the early 1980s, country risk has become a topic of major concern for the international financial community. A critical assessment of country risk is essential because it reflects the ability and willingness of a country to service its financial obligations. Various risk rating agencies employ different methods to determine country risk ratings, combining a range of qualitative and quantitative information regarding alternative measures of economic, financial and political risk into associated composite risk ratings. This paper provides an international comparison of country risk ratings compiled by the International Country Risk Guide (ICRG), which is the only international rating agency to provide detailed and consistent monthly data over an extended period for a large number of countries. As risk ratings can be treated as indexes, their rate of change, or returns, merits attention in the same manner as financial returns. For this reason, a constant correlation multivariate asymmetric ARMA-GARCH model is presented and its underlying structure is established, including the unique, strictly stationary and ergodic solution of the model, its causal expansion, and convenient sufficient conditions for the existence of moments. Alternative empirically verifiable sufficient conditions for the consistency and asymptotic normality of the quasi-maximum likelihood estimator are established under non-normality of the conditional (or standardized) shocks. 
The empirical results provide a comparative assessment of the conditional means and volatilities associated with international country risk returns across countries and over time, enable a validation of the regularity conditions underlying the models, highlight the importance of economic, financial and political risk ratings as components of a composite risk rating, evaluate the multivariate effects of alternative risk returns and different countries, and evaluate the usefulness of the ICRG risk ratings in modelling risk returns.
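The paper's multivariate specification is not reproduced in the abstract; as a minimal univariate illustration of the asymmetry such models capture, a GJR-type variance recursion lets negative shocks raise next-period volatility more than positive ones. In the constant-conditional-correlation multivariate version, per-series variances of this kind are combined with a fixed correlation matrix R to form H_t = D_t R D_t. The recursion below is an illustrative sketch with generic parameter names, not the paper's notation:

```python
def gjr_variance_path(eps, omega, alpha, gamma, beta):
    """Illustrative GJR(1,1)-type asymmetric variance recursion:
        h_t = omega + (alpha + gamma * 1[eps_{t-1} < 0]) * eps_{t-1}^2 + beta * h_{t-1}.
    Initialised at the unconditional variance omega / (1 - alpha - gamma/2 - beta),
    which is valid when that persistence is below one."""
    persistence = alpha + 0.5 * gamma + beta
    h = [omega / (1.0 - persistence)]
    for e in eps:
        indicator = 1.0 if e < 0 else 0.0  # the leverage term fires on bad news
        h.append(omega + (alpha + gamma * indicator) * e * e + beta * h[-1])
    return h
```

With omega = 0.1, alpha = 0.05, gamma = 0.1, beta = 0.8, a shock of -1 yields h_1 = 1.05 while a shock of +1 yields 0.95: the leverage effect in one line.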

    Modelling mitochondrial epilepsy in vitro: conceptualisation, mechanisms, and therapeutic implications

    PhD Thesis. Up to a third of patients with mitochondrial disease exhibit epilepsy that is difficult to control with existing pharmacotherapies. Anti-epileptic drug development in this field has stalled owing to a paucity of pre-clinical models. To address this, I developed a novel in vitro model of mitochondrial epilepsy. The main features of the model are pharmacological inhibition of respiratory chain complexes I and IV together with inhibition of astrocytic aconitase. In this way, I observed epileptiform discharges in rat and mouse hippocampal brain slices. Using immunohistochemical techniques, I confirmed findings from human neuropathological studies: epileptic slices showed a selective loss of GABAergic interneurons, sparing of pyramidal neurons, and profound astrogliosis. To demonstrate clinical relevance, I established the model's predictive validity by observing that epileptiform activity was unaffected by various antiepileptic drugs. These studies did, however, reveal an involvement of AMPA and GABAA receptors in the generation of the epileptiform discharges. Further experiments also implicated the astroglial glutamate-glutamine cycle in the process of epileptogenesis. Using metabolic tracing experiments, glutamine was observed to be depleted during the epileptic state. Glutamine supplementation sustained the synthesis of GABA in the epileptic tissue, presumably to restore metabolic homeostasis. Finally, aerobic respiration was inhibited alongside an upregulation of anaerobic glycolysis during the epileptic state. To dissect the role of glutamine in the modulation of mitochondrial respiration, I isolated neuronal and astrocytic populations and measured metabolic flux during the induction of seizure activity. Glutamine rescued partial mitochondrial respiratory chain inhibition selectively in astrocytes.
    Overall, I have developed an in vitro model of mitochondrial epilepsy and shown that the interaction between astrocytes and neurons is a prerequisite for seizure generation. Several pharmacological targets have emerged from these studies, which may provide novel therapeutic avenues for this condition.
    This study was primarily funded by an EPSRC Industrial CASE Award (EP/K50499X/1) studentship in collaboration with GlaxoSmithKline. I would also like to acknowledge HelloBio for the travel grant for attending the Society for Neuroscience Meeting to present this work, the Network of European Neuroscience Schools (NENS) Exchange Grant for funding the collaborative work done with the University of Copenhagen, and Eisai and The Wellcome Trust Biomedical Vacation Scholarships grant for their contribution towards the perampanel study.

    It Pays to Violate: How Effective are the Basel Accord Penalties?

    The internal models amendment to the Basel Accord allows banks to use internal models to forecast Value-at-Risk (VaR) thresholds, which are used to calculate the required capital that banks must hold in reserve as a protection against negative changes in the value of their trading portfolios. As capital reserves impose an opportunity cost on banks, banks could be tempted to use models that underpredict risk and hence lead to low capital charges. To avoid this problem, the Basel Accord introduced a backtesting procedure, whereby banks using models that lead to excessive violations are penalised through higher capital charges. This paper investigates the performance of five popular volatility models that can be used to forecast VaR thresholds under a variety of distributional assumptions. The results suggest that, within the current constraints and penalty structure of the Basel Accord, the lowest capital charges arise when using models that lead to excessive violations, suggesting that the current penalty structure is not severe enough to induce prudent risk management. In addition, an alternative penalty structure is suggested that would be more effective in aligning the interests of banks and regulators.
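The penalty structure under discussion is the 1996 Basel backtesting ("traffic-light") scheme: the market-risk capital charge is the larger of the previous day's VaR and the average VaR over the last 60 trading days, scaled by a base multiplier of 3 plus a plus-factor that grows with the number of violations over the last 250 days. A minimal sketch (specific-risk surcharge and other refinements omitted):

```python
def basel_multiplier(violations):
    """Base multiplier 3 plus the Basel (1996) traffic-light plus-factor,
    keyed on the number of VaR violations in the last 250 trading days."""
    plus_factor = {0: 0.00, 1: 0.00, 2: 0.00, 3: 0.00, 4: 0.00,  # green zone
                   5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}  # yellow zone
    return 3.0 + plus_factor.get(violations, 1.00)               # red zone: 10+

def capital_charge(var_yesterday, var_last_60, violations):
    """Market-risk capital: max of yesterday's VaR and the scaled 60-day average."""
    avg_var = sum(var_last_60) / len(var_last_60)
    return max(var_yesterday, basel_multiplier(violations) * avg_var)
```

A model producing six violations faces a multiplier of 3.5 rather than 3.0; the paper's point is that a risk-underpredicting model's much lower average VaR can more than offset this modest increase.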


    Stability Tests for Heterogeneous Panel Data

    This paper proposes a new test for structural instability in heterogeneous panels. The test builds on the seminal work of Andrews (2003), originally developed for time series. It is robust to non-normal, heteroskedastic and serially correlated errors, and allows the number of post-break observations to be small. Importantly, the test considers the alternative of a break affecting only some - and not all - individuals of the panel. Under mild assumptions the test statistic is shown to be asymptotically normal, thanks to the additional cross-sectional dimension of panel data. This greatly facilitates the calculation of critical values. Monte Carlo experiments show that the test has good size and power under a wide range of circumstances. The test is then applied to investigate the effect of the Euro on trade.
    Keywords: structural change, end-of-sample instability tests, heterogeneous panels, Monte Carlo, Euro effect on trade.
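The exact statistic is in the paper; the cross-sectional aggregation idea, however, can be illustrated. Below, each unit's post-break sum of squared prediction errors (from a pre-break fit, here just a mean for simplicity) is standardised, and the studentised cross-sectional average is approximately standard normal under the null by a central limit theorem in N. This is a hypothetical simplification, not the authors' test:

```python
import math
import statistics

def panel_break_stat(panel, t_break):
    """Illustrative pooled end-of-sample stability statistic (a simplified
    stand-in for the paper's test). panel: list of per-unit time series."""
    unit_stats = []
    for y in panel:
        pre, post = y[:t_break], y[t_break:]
        mu = statistics.fmean(pre)              # pre-break fit (mean only)
        s2 = statistics.pvariance(pre)          # pre-break error variance
        m = len(post)
        sse = sum((v - mu) ** 2 for v in post)  # post-break prediction errors
        # Under the null with Gaussian errors, sse has mean ~ m*s2
        # and variance ~ 2*m*s2^2, giving a standardised unit statistic.
        unit_stats.append((sse - m * s2) / (math.sqrt(2.0 * m) * s2))
    n = len(unit_stats)
    spread = max(statistics.pstdev(unit_stats), 1e-12)
    return math.sqrt(n) * statistics.fmean(unit_stats) / spread
```

Even when each unit has only a handful of post-break observations, averaging over many units delivers an approximately normal statistic - the advantage of the cross-sectional dimension noted in the abstract.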